💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion of data unlearning in cutting-edge deep generative models. We will explore recent breakthroughs and open challenges, focusing on how these models handle unlearning tasks and where improvements can be made.

✅ This Week's Presentation:

🔹 Title: Data Unlearning in Diffusion Models


🔸 Presenter: Aryan Komaei

🌀 Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns regarding data privacy and copyright compliance. While retraining these models from scratch to remove specific data is computationally costly, existing unlearning methods often rely on strong assumptions or exhibit instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS). SISS leverages importance sampling to provide the first method for data unlearning in diffusion models with theoretical guarantees.
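For intuition ahead of the talk (this is not the paper's exact SISS formulation): subtracted objectives of this kind pair a loss to minimize on data that should be kept with a loss to maximize on data that should be forgotten, and importance sampling lets both terms be estimated from a single stream of mixture samples. The toy 1-D sketch below is purely illustrative, with Gaussians standing in for the keep/forget data and a quadratic standing in for the per-sample loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 1-D Gaussians stand in for the data to keep and the data to forget.
mu_keep, mu_forget, sigma = 0.0, 3.0, 1.0

def pdf(x, mu):
    """Gaussian density N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def per_sample_loss(x):
    """Quadratic stand-in for a per-sample training loss (illustrative)."""
    return x ** 2

lam, n = 0.5, 200_000

# Draw from the mixture q = lam * p_keep + (1 - lam) * p_forget.
pick_keep = rng.random(n) < lam
x = np.where(pick_keep,
             rng.normal(mu_keep, sigma, n),
             rng.normal(mu_forget, sigma, n))

q = lam * pdf(x, mu_keep) + (1 - lam) * pdf(x, mu_forget)
w_keep = pdf(x, mu_keep) / q      # importance weight toward the keep data
w_forget = pdf(x, mu_forget) / q  # importance weight toward the forget data

# Subtracted objective: minimize loss on kept data, maximize it on
# forgotten data. One set of mixture samples estimates both terms:
#   E_keep[loss] - E_forget[loss]   (ground truth here: 1 - 10 = -9)
estimate = np.mean((w_keep - w_forget) * per_sample_loss(x))
print(estimate)
```

In the actual SISS losses the per-sample term is the diffusion denoising objective rather than a quadratic; the sketch only demonstrates the importance-sampling mechanics that make the keep-minus-forget expectation estimable from one mixture of samples.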

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️



tg-me.com/RIMLLab/196


BY RIML Lab






